    The involutions-as-principal types/application-as-unification analogy

    In 2005, S. Abramsky introduced various universal models of computation based on Affine Combinatory Logic, consisting of partial involutions over a suitable formal language of moves, in order to discuss reversible computation in a game-theoretic setting. We investigate Abramsky's models from the point of view of the model theory of λ-calculus, focusing on the purely linear and affine fragments of Abramsky's Combinatory Algebras. Our approach stems from a structural analogy, not previously pointed out in the literature, between the partial involution interpreting a combinator and the principal type of that term with respect to a simple type discipline for λ-calculus. This analogy allows us to explain the somewhat awkward linear application of involutions arising from the Geometry of Interaction (GoI) as unification between principal types. Our approach immediately yields an answer to the open problem, raised by Abramsky, of characterising those finitely describable partial involutions which are denotations of combinators in the purely affine fragment. We also prove that the (purely) linear combinatory algebra of partial involutions is a (purely) linear λ-algebra, albeit not a combinatory model, while the (purely) affine combinatory algebra is not. In order to check the complex equations involved in the definition of affine λ-algebra, we implement in Erlang the compilation of λ-terms as involutions, and their execution.
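
    As a rough illustration of the "application as unification between principal types" idea, the following Python sketch (ours, not the paper's Erlang or GoI construction) computes the type of an application by unifying the domain of the function's principal type with the argument's principal type, using plain first-order unification over simple types; all names are illustrative.

        # Minimal sketch: simple types with variables, Robinson-style unification
        # (occurs check omitted for brevity), and "application as unification
        # between principal types". Principal types are assumed renamed apart.

        class Var:
            def __init__(self, name): self.name = name
            def __repr__(self): return self.name

        class Arrow:
            def __init__(self, dom, cod): self.dom, self.cod = dom, cod
            def __repr__(self): return f"({self.dom} -> {self.cod})"

        def walk(t, s):
            # Follow variable bindings in substitution s.
            while isinstance(t, Var) and t.name in s:
                t = s[t.name]
            return t

        def unify(a, b, s):
            a, b = walk(a, s), walk(b, s)
            if isinstance(a, Var):
                return s if isinstance(b, Var) and b.name == a.name else {**s, a.name: b}
            if isinstance(b, Var):
                return {**s, b.name: a}
            if isinstance(a, Arrow) and isinstance(b, Arrow):
                return unify(a.cod, b.cod, unify(a.dom, b.dom, s))
            raise ValueError("types do not unify")

        def resolve(t, s):
            # Apply the substitution s throughout a type.
            t = walk(t, s)
            return Arrow(resolve(t.dom, s), resolve(t.cod, s)) if isinstance(t, Arrow) else t

        def apply_types(fun_ty, arg_ty):
            """Type of an application (M N) from the principal types of M and N."""
            s = unify(fun_ty.dom, arg_ty, {})
            return resolve(fun_ty.cod, s)

        # Example: I : a -> a applied to K : b -> (c -> b) has type b -> (c -> b).
        a, b, c = Var("a"), Var("b"), Var("c")
        print(apply_types(Arrow(a, a), Arrow(b, Arrow(c, b))))   # (b -> (c -> b))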

    LF+ in Coq for fast-and-loose reasoning

    We develop the metatheory and the implementation, in Coq, of the novel logical framework LF+ and discuss several of its applications. LF+ generalises research work, carried out by the authors over more than a decade, on Logical Frameworks that conservatively extend LF and feature lock-type constructors L^P_{N:σ}[·]. Lock-types capture monadically the concept of inhabitability up-to. They were originally introduced for factoring out, postponing, or delegating to external tools the verification of time-consuming judgments which are morally proof-irrelevant, thus allowing different sources of epistemic evidence to be integrated in a single Logical Framework. Besides introducing LF+ and its "shallow" implementation in Coq, the main novelty of the paper is to show that lock-types are also a very flexible tool for expressing in Type Theory several diverse cognitive attitudes and mental strategies used in ordinary reasoning, which essentially amount to reasoning up-to, as in e.g. Typical Ambiguity provisos or co-inductive Coq proofs. In particular, we address the encoding of the emerging paradigm of fast-and-loose reasoning, which trades correctness for efficiency. This paradigm, normally used implicitly in naive Set Theory, is also having a considerable impact in computer architecture and distributed systems, where branch prediction and optimistic concurrency control are implemented.

    Synthesis, crystallographic characterization, and mechanical behavior of alumina chromia alloys

    Powder mixtures of alumina and chromia, blended in different proportions (1, 3, 5, and 10 wt%) by attrition milling, were fired either by pressureless sintering in air or by hot pressing under vacuum. The resulting materials, characterized by X-ray diffraction, Raman spectroscopy, SEM, hardness, and fracture toughness measurements, showed that all compositions form a complete solid solution that maintains the corundum crystal structure; the chromia addition retards densification of the pressureless-sintered samples but not of the hot-pressed ones. Data from Raman spectroscopy and SEM/EDXS showed the appearance of Ti- and Mn-based impurities near the indentation print, in particular on fractured grains. The addition of chromia improves hardness but does not affect toughness, which is instead greatly influenced by the materials' residual porosity.

    CHA2DS2-VASc Score Predicts Adverse Outcome in Patients with Simple Congenital Heart Disease Regardless of Cardiac Rhythm

    Adult patients with simple congenital heart disease (sACHD) represent an expanding population vulnerable to atrial arrhythmias (AA). The CHA2DS2-VASc score estimates thromboembolic risk in patients with non-valvular atrial fibrillation. We investigated the prognostic role of the CHA2DS2-VASc score in an unselected sACHD population regardless of cardiac rhythm. Between November 2009 and June 2018, 427 sACHD patients (377 in sinus rhythm, 50 in AA) were consecutively referred to our ACHD service. Cardiovascular hospitalization and/or all-cause death were considered as the composite primary end-point. Patients were divided into group A, with a CHA2DS2-VASc score of 0 or 1 point, and group B, with a score greater than 1 point. Group B included 197 patients (46%), who were older and had a larger prevalence of cardiovascular risk factors than group A. During a mean follow-up of 70 months (IQR 40–93), the primary end-point occurred in 94 patients (22%): 72 (37%) in group B and 22 (10%, p < 0.001) in group A. The rate of all-cause death was also significantly higher in group B than in group A (22% vs 2%, p < 0.001). Multivariable Cox regression analysis revealed that the CHA2DS2-VASc score was independently related to the primary end-point (HR 1.84 [1.22–2.77], p = 0.004) together with retrospective AA, stroke/TIA/peripheral thromboembolism, and diabetes. Furthermore, the CHA2DS2-VASc score independently predicted the primary end-point in the large subgroup of 377 patients in sinus rhythm (HR 2.79 [1.54–5.07], p = 0.01). In conclusion, the CHA2DS2-VASc score accurately stratifies sACHD patients at different long-term risk of adverse clinical events, regardless of cardiac rhythm.
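
    For concreteness, here is a minimal sketch (ours, not the study's code) of the standard CHA2DS2-VASc point assignment and of the group A (0-1 points) versus group B (>1 point) split used above; function and parameter names are illustrative.

        def cha2ds2_vasc(chf, hypertension, age, diabetes,
                         stroke_tia_te, vascular_disease, female):
            """Standard CHA2DS2-VASc point assignment (maximum 9 points)."""
            score = 0
            score += 1 if chf else 0                  # C: congestive heart failure / LV dysfunction
            score += 1 if hypertension else 0         # H: hypertension
            score += 2 if age >= 75 else (1 if 65 <= age <= 74 else 0)  # A2 / A: age
            score += 1 if diabetes else 0             # D: diabetes mellitus
            score += 2 if stroke_tia_te else 0        # S2: prior stroke / TIA / thromboembolism
            score += 1 if vascular_disease else 0     # V: vascular disease
            score += 1 if female else 0               # Sc: sex category (female)
            return score

        def risk_group(score):
            # Group A: score 0 or 1; group B: score greater than 1.
            return "A" if score <= 1 else "B"

        # Example: a 68-year-old woman with hypertension scores 3 points (group B).
        s = cha2ds2_vasc(chf=False, hypertension=True, age=68, diabetes=False,
                         stroke_tia_te=False, vascular_disease=False, female=True)
        print(s, risk_group(s))   # 3 B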

    A dependent nominal type theory

    Nominal abstract syntax is an approach to representing names and binding pioneered by Gabbay and Pitts. So far, nominal techniques have mostly been studied using classical logic or model theory, not type theory. Nominal extensions to simple, dependent, and ML-like polymorphic languages have been studied, but decidability and normalization results have only been established for simple nominal type theories. We present an LF-style dependent type theory extended with name-abstraction types, prove soundness and decidability of beta-eta-equivalence checking, discuss adequacy and canonical forms via an example, and discuss extensions such as dependently typed recursion and induction principles.

    Internal Adequacy of Bookkeeping in Coq

    We focus on a common problem encountered when encoding and formally reasoning about a wide range of formal systems, namely the representation of a typing environment. In particular, we apply the bookkeeping technique to a well-known case study (System F<:'s type language), proving in Coq an internal correspondence with a more standard representation of the typing environment as a list of pairs. In order to keep the signature readable and concise, we make use of higher-order abstract syntax (HOAS), which allows us to deal smoothly with the representation of the universal binder of the System F<: type language.
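
    As a rough illustration of the HOAS idea (the paper's encoding lives in Coq; this Python sketch and its names are ours), the bounded universal binder of the System F<: type language can be represented by a host-language function, so object-level type variables and substitution are inherited from the meta level.

        from dataclasses import dataclass
        from typing import Callable

        class Ty: pass

        @dataclass
        class Top(Ty):
            pass

        @dataclass
        class Arrow(Ty):
            dom: Ty
            cod: Ty

        @dataclass
        class Forall(Ty):
            bound: Ty                    # upper bound of the quantified type variable
            body: Callable[[Ty], Ty]     # HOAS: the binder is a function of the host language

        # forall X <: Top. X -> X, written without any concrete variable syntax:
        poly_id = Forall(Top(), lambda X: Arrow(X, X))

        # Instantiation is just application of the body:
        inst = poly_id.body(Arrow(Top(), Top()))   # (Top -> Top) -> (Top -> Top)
        print(inst)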

    Competing risks between mortality and heart failure hospital re-admissions: a community-based investigation from the Trieste area

    Predictors of mortality and readmission among patients hospitalized for heart failure (HF) were investigated in a large, unselected population of the Trieste area. The cohort of 4666 patients who survived the index admission in the period 2009-2014 was followed after discharge. Incidences of mortality and HF re-admission within 30 days and within one year were computed, comparing cumulative incidence probabilities with cause-specific Kaplan-Meier curves. Competing-risks regression was used to identify the factors associated with HF re-admission and with death, respectively. Two distinct risk profiles were obtained, particularly for early outcomes, which are useful for better targeting the treatment of these high-risk patients.
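
    As an illustration of the methodology (a from-scratch sketch, not the study's code; the toy data and names are invented), the nonparametric cumulative incidence of one event type in the presence of a competing risk, e.g. HF re-admission competing with death, can be computed as follows.

        import numpy as np

        def cumulative_incidence(times, events, cause):
            """Aalen-Johansen-style cumulative incidence for `cause`.
            `events`: 0 = censored, positive integers = cause codes."""
            times, events = np.asarray(times, float), np.asarray(events)
            order = np.argsort(times)
            times, events = times[order], events[order]
            n, surv, cif = len(times), 1.0, 0.0
            out_t, out_cif, i = [], [], 0
            while i < n:
                j = i
                while j < n and times[j] == times[i]:
                    j += 1
                at_risk = n - i
                d_all = int(np.sum(events[i:j] > 0))      # events of any cause at this time
                d_cause = int(np.sum(events[i:j] == cause))
                cif += surv * d_cause / at_risk           # probability mass falling on `cause`
                surv *= 1.0 - d_all / at_risk             # overall (all-cause) survival
                out_t.append(times[i])
                out_cif.append(cif)
                i = j
            return np.array(out_t), np.array(out_cif)

        # Toy data: follow-up in days; 1 = HF re-admission, 2 = death, 0 = censored.
        t, c = cumulative_incidence([5, 12, 12, 20, 33, 40], [1, 2, 0, 1, 0, 2], cause=1)
        print(dict(zip(t.tolist(), c.round(3).tolist())))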

    Consistency of the Theory of Contexts

    The Theory of Contexts is a type-theoretic axiomatization aiming to give a metalogical account of the fundamental notions of variable and context as they appear in Higher Order Abstract Syntax. In this paper, we prove that this theory is consistent by building a model based on functor categories. By means of a suitable notion of forcing, we prove that this model validates Classical Higher Order Logic, the Theory of Contexts, and also (parametrised) structural induction and recursion principles over contexts. Our approach, which we present in full detail, should also be useful for reasoning about other models based on functor categories. Moreover, the construction could also be adopted, and possibly generalized, for validating other theories of names and binders.
